USAToday Column


July 19, 2016
Is semi-autonomous driving really viable?

By Bob O'Donnell

FOSTER CITY, Calif. — The recent crash of a Tesla Model S under Autopilot control has raised serious concerns about the safety of autonomous driving features on Teslas in particular, and on all cars in general. The US National Highway Traffic Safety Administration (NHTSA)—the organization that offers the 5-star safety rating system for new cars—is investigating the details of the unfortunate incident and may come up with more guidelines in this area, which many people believe is severely lacking in any real oversight.

Much has already been written on the issue, but everything I’ve seen has ignored the key question that this incident has brought to our attention: is it really reasonable or safe to offer a semi-autonomous driving mode, in which a driver temporarily gives complete control of a car over to computer-controlled systems, but then needs to take it back in certain situations (such as a potential safety hazard)?

To put it in the language of NHTSA’s guidelines for the development of autonomous driving technology: should there really be a Level 3 for autonomous driving? Increasingly, I believe the answer is no.

For those who don’t know, NHTSA has a five-level set of guidelines for autonomous driving (numbered 0 through 4) that lays out an orderly progression of technological advancements. Level 0 has absolutely no degree of automation—the driver is in complete control of everything. Level 1 offers some basic technological help, such as electronic stability control or pre-charged brakes—nearly all cars on the road today at least meet this level. Level 2 is called Combined Function Automation, and it specifies the use of several functions at once, such as the adaptive cruise control and lane-centering features that some newer cars have. Level 3 allows the car to take control of the driving in certain limited situations, such as highway driving only, but drivers are still expected to ultimately be in control. Finally, Level 4 is for cars that require absolutely no intervention from a driver/passenger and can get from point A to point B without assistance.
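
For readers who want that taxonomy in one compact view, here is a minimal, purely illustrative sketch in Python of the five levels as the column summarizes them. The enum and function names below are shorthand for this article, not anything published by NHTSA.

    from enum import IntEnum

    class NhtsaAutomationLevel(IntEnum):
        """Illustrative shorthand for the five NHTSA levels described above."""
        NO_AUTOMATION = 0         # Level 0: driver controls everything
        FUNCTION_SPECIFIC = 1     # Level 1: basic aids, e.g. electronic stability control
        COMBINED_FUNCTION = 2     # Level 2: several aids at once, e.g. cruise control plus lane centering
        LIMITED_SELF_DRIVING = 3  # Level 3: car drives itself in limited situations; driver must take over
        FULL_SELF_DRIVING = 4     # Level 4: no driver intervention needed from point A to point B

    def driver_is_the_fallback(level: NhtsaAutomationLevel) -> bool:
        """At every level below 4, a human is still expected to take back control."""
        return level < NhtsaAutomationLevel.FULL_SELF_DRIVING
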

In reality, some of the newer autonomous driving features—such as automatic safety braking, automatic parking and other similarly advanced features—are essentially Level 2.5. They’re more advanced than basic cruise control, but aren’t truly autonomous.

Many believe that Tesla’s Autopilot feature starts to venture into Level 3 territory—it offers a certain degree of autonomy in certain environments. The problem is that Level 3 still requires drivers to stay ready to take back control, yet it essentially lulls them into thinking (and acting) as if it were Level 4. YouTube is already filled with videos from the small pool of existing Tesla owners who treat driving with Autopilot as a true intervention-free driving mode: reading newspapers, sitting in the back seat (!), playing games and doing other activities that would (theoretically) be fine in Level 4 cars, but are extremely dangerous at Level 3.

Perhaps in light of these concerns, Consumer Reports recently called on Tesla to disable and rename its Autopilot function. Tesla, however, said it has no plans to do so, arguing that its cars are safer and less prone to accidents than those without autonomous driving controls. Instead, the company plans to focus on driver education.

Now, you could argue pretty persuasively that the people doing these things in their Teslas are, frankly, idiots who may be headed toward a Darwinian demise, but I don’t think it’s that easy. A basic understanding of human nature suggests that once people shift their attention to other activities, it’s hard to get them to snap back to the original task quickly. Add in the need for extremely fast transitions in potential life-or-death situations, and it’s easy to see why Level 3-type autonomy could be an extremely difficult, or even unsolvable, problem.

Level 4, on the other hand, is still a very valid and exciting goal, and something that lots of companies are going to be working on for some time. Realistically, it’s going to be a while before it comes to fruition, but it does provide the kind of safety outcomes that most people will likely expect (or demand) from autonomous cars. Anything in between Level 4 and the kinds of autonomous safety enhancements (Level 2.5) we’re starting to see, however, looks to be dangerous ground.

Arguably, airplanes and other types of vehicles do have autopilot modes that are very successful, but pilots have to go through a great deal of training in order to use them. Expecting everyone who wants to use a Level 3-capable autonomous car to undergo that kind of training is simply unrealistic—regardless of how much effort carmakers put into educating their customers.

From a purely technological perspective, the NHTSA autonomous driving guidelines do make logical sense. The progression from one level to the next is exactly how we can expect the technology to develop. However, when you incorporate the human factor, those levels start to look increasingly naïve, increasingly unrealistic, and increasingly unsafe. It’s time to take a fresh look.

Here's a link to the column: http://www.usatoday.com/story/tech/columnist/2016/07/19/semi-autonomous-driving-really-viable/87083214/

USA TODAY columnist Bob O'Donnell is the president and chief analyst of TECHnalysis Research, a market research and consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. His clients are major technology firms including Microsoft, HP, Dell, and Qualcomm. You can follow him on Twitter @bobodtech.